You'll get new warnings about sensitive content.

Apple’s iOS 17 Enhances Security Against Unwanted Nudity

Apple’s iOS 17 simplifies content sharing while adding measures to prevent misuse. The company says the upcoming software will include a Sensitive Content Warning feature that helps adults avoid unsolicited nude photos and videos. If you receive something potentially upsetting, you can decline to view it, accept it, or find ways to get help.

Communication Safety, meanwhile, now protects children outside the Messages app. The feature uses on-device machine learning to detect and blur sexual content sent and received via AirDrop, Contact Posters, FaceTime messages, and the Photos picker. The technology can now recognize video as well as still images. If such content does reach kids, they can message trusted adults for help or find supportive resources.

Both the Sensitive Content Warning and Communication Safety process content on the device, and Apple says it has no access to the material. Communication Safety requires that Family Sharing is enabled and that the relevant accounts are designated as belonging to children.

Apple announced plans in 2021 to curb unwanted nudity, as well as to flag photos uploaded to iCloud that matched known child sexual abuse material (CSAM). The company scrapped the CSAM-scanning plan at the end of 2022 over fears that governments might pressure it to search for other types of images, not to mention the risk of false positives. Communication Safety and the Sensitive Content Warning don’t share these issues; they exist simply to keep unsolicited nudity from upsetting recipients.

Lawmakers are working to criminalize unsolicited nudity, and individual services have their own anti-nudity tools. In light of this, Apple is essentially filling gaps in the deterrence system. In theory, bad actors will be less inclined to bombard iPhone users with crude messages and calls.
